On Measure-Theoretic aspects of Nonextensive Entropy Functionals and corresponding Maximum Entropy Prescriptions

Authors

  • Ambedkar Dukkipati
  • Shalabh Bhatnagar
  • M. Narasimha Murty
Abstract

Shannon entropy of a probability measure P, defined as $-\int_X \frac{dP}{d\mu}\ln\frac{dP}{d\mu}\,d\mu$ on a measure space $(X,\mathcal{M},\mu)$, is not a natural extension of the discrete case. However, maximum entropy (ME) prescriptions for the Shannon entropy functional in the measure-theoretic case are consistent with those for the discrete case. It is also well known that Kullback-Leibler relative entropy extends naturally to the measure-theoretic case. In this paper, we study the measure-theoretic aspects of nonextensive (Tsallis) entropy functionals and discuss the corresponding ME prescriptions. We present two results in this regard: (i) we prove that, as in the case of classical relative entropy, the measure-theoretic definition of Tsallis relative entropy is a natural extension of its discrete case, and (ii) we show that ME prescriptions of measure-theoretic Tsallis entropy are consistent with the discrete case with respect to a particular instance of ME.
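For orientation, the quantities involved can be sketched in a commonly used convention (the paper itself may use a slightly different normalization). The discrete Tsallis entropy and Tsallis relative entropy, and the measure-theoretic form of the latter, read

\[
S_q(p) = \frac{1 - \sum_k p_k^{\,q}}{q-1}, \qquad
I_q(p \,\|\, r) = \frac{\sum_k p_k^{\,q}\, r_k^{\,1-q} - 1}{q-1},
\]
\[
I_q(P \,\|\, R) = \frac{1}{q-1}\left( \int_X \left(\frac{dP}{dR}\right)^{q-1} dP - 1 \right),
\]

and each reduces to its Shannon or Kullback-Leibler counterpart in the limit $q \to 1$.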

Similar resources

On Measure Theoretic definitions of Generalized Information Measures and Maximum Entropy Prescriptions

Though Shannon entropy of a probability measure P, defined as $-\int_X \frac{dP}{d\mu}\ln\frac{dP}{d\mu}\,d\mu$ on a measure space $(X,\mathcal{M},\mu)$, does not qualify as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are consistent with those of the discrete case. In this paper, we study the measure-theoretic definitions of generalized information measures and discuss the ME prescriptions. We prese...
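A standard illustration of why this functional is not a natural extension (not taken from the paper itself): for P uniform on $[0,a] \subset \mathbb{R}$ with $\mu$ Lebesgue measure, $\frac{dP}{d\mu} = \frac{1}{a}$ and

\[
-\int_0^a \frac{1}{a}\ln\frac{1}{a}\,dx = \ln a,
\]

which is negative whenever $a < 1$, whereas discrete Shannon entropy is always nonnegative.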


On Generalized Measures of Information with Maximum and Minimum Entropy Prescriptions

Kullback-Leibler relative-entropy or KL-entropy of P with respect to R, defined as $\int_X \ln\frac{dP}{dR}\,dP$, where P and R are probability measures on a measurable space $(X,\mathcal{M})$, plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended naturally to the nondiscrete case. Further, entropy and oth...
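To make the "natural extension" claim concrete (a standard observation, not specific to this paper): when X is countable and P, R assign probabilities $p_k$, $r_k$ with $r_k > 0$, the Radon-Nikodym derivative is $\frac{dP}{dR}(x_k) = \frac{p_k}{r_k}$, so

\[
\int_X \ln\frac{dP}{dR}\,dP = \sum_k p_k \ln\frac{p_k}{r_k},
\]

which is exactly the familiar discrete KL-entropy.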


On some entropy functionals derived from Rényi information divergence

We consider the maximum entropy problems associated with Rényi Q-entropy, subject to two kinds of constraints on expected values. The constraints considered are a constraint on the standard expectation, and a constraint on the generalized expectation as encountered in nonextensive statistics. The optimum maximum entropy probability distributions, which can exhibit a power-law behaviour, are der...
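For orientation, the two constraint types mentioned here are typically of the following forms, stated for a density $p = \frac{dP}{d\mu}$ (exact conventions vary across the nonextensive-statistics literature):

\[
\int_X u\, p \, d\mu = U \quad\text{(standard expectation)}, \qquad
\frac{\int_X u\, p^{q}\, d\mu}{\int_X p^{q}\, d\mu} = U \quad\text{(generalized, or escort, expectation)}.
\]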


Spectral Graph Theoretic Analysis of Tsallis Entropy-based Dissimilarity Measure

In this paper we introduce a nonextensive quantum information theoretic measure which may be defined between any arbitrary number of density matrices, and we analyze its fundamental properties in the spectral graph-theoretic framework. Unlike other entropic measures, the proposed quantum divergence is symmetric, matrix-convex, theoretically upper-bounded, and has the advantage of being generali...


Navier-Stokes Equations for Generalized

Tsallis has proposed a generalization of Boltzmann-Gibbs thermostatistics by introducing a family of generalized nonextensive entropy functionals with a single parameter q. These reduce to the extensive Boltzmann-Gibbs form for q = 1, but a remarkable number of statistical and thermodynamic properties have been shown to be q-invariant, that is, valid for any q. In this paper, we address the qu...
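The q = 1 reduction mentioned here can be checked directly. With k a positive constant (Boltzmann's constant in the physical setting), writing $p_k^{\,q} = p_k\, e^{(q-1)\ln p_k}$ and expanding to first order in $(q-1)$,

\[
S_q = k\,\frac{1 - \sum_k p_k^{\,q}}{q-1}
    = k\,\frac{1 - \sum_k p_k\bigl(1 + (q-1)\ln p_k + O((q-1)^2)\bigr)}{q-1}
    \;\xrightarrow[q \to 1]{}\; -k \sum_k p_k \ln p_k,
\]

which is the Boltzmann-Gibbs entropy.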




Publication date: 2007